Finite Sample Based Mutual Information

Authors

Abstract

Mutual information is a popular metric in machine learning. In the case of a discrete target variable and a continuous feature, the mutual information can be calculated as a sum-integral of the log-likelihood ratio of the joint and marginal density distributions, weighted by the joint density. In practice, however, the true distributions are unavailable and only a finite sample of the population is given. In this paper, we propose a novel method for calculating the mutual information between such variables from a finite sample. The proposed method is based on approximating the underlying distributions with Kernel Density Estimation. Unlike previous kernel-based approaches to estimating mutual information, our method calculates the integral involved in the formula directly. Numerical experiments demonstrate that the method produces more accurate results than currently used feature-selection approaches. In addition, it demonstrates substantially faster computation times than the benchmark methods.
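The abstract's formula, I(X;Y) = Σ_c p(c) ∫ f(x|c) log(f(x|c)/f(x)) dx for a discrete target Y and continuous feature X, can be sketched with plain NumPy. This is a minimal illustration of the general idea (Gaussian KDE plus direct numerical integration on a grid), not the paper's actual algorithm; the bandwidth, grid size, and synthetic data below are illustrative choices.

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth):
    # Gaussian kernel density estimate of `samples`, evaluated on `grid`.
    z = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(samples) * bandwidth * np.sqrt(2.0 * np.pi))

def mutual_information(x, y, n_grid=512, bandwidth=0.3):
    # I(X;Y) = sum_c p(c) * integral of f(x|c) * log(f(x|c) / f(x)) dx,
    # with densities replaced by KDEs and the integral by a Riemann sum.
    grid = np.linspace(x.min() - 4 * bandwidth, x.max() + 4 * bandwidth, n_grid)
    dx = grid[1] - grid[0]
    f_x = gaussian_kde(x, grid, bandwidth)      # marginal density of the feature
    mi = 0.0
    for c in np.unique(y):
        xc = x[y == c]
        p_c = len(xc) / len(x)                  # class prior p(c)
        f_xc = gaussian_kde(xc, grid, bandwidth)
        mask = (f_xc > 1e-300) & (f_x > 1e-300)  # avoid log(0) in empty regions
        mi += p_c * np.sum(f_xc[mask] * np.log(f_xc[mask] / f_x[mask])) * dx
    return mi

# Synthetic check: a feature whose mean shifts with a binary class has positive MI.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=2000)
x = rng.normal(loc=2.0 * y, scale=1.0)
mi = mutual_information(x, y)   # positive, bounded above by H(Y) = ln 2
```

Because the marginal KDE of the pooled sample is exactly the prior-weighted mixture of the per-class KDEs (same bandwidth), the estimate stays between 0 and H(Y), as mutual information should.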


Similar Articles

On Classification of Bivariate Distributions Based on Mutual Information

Among all measures of independence between random variables, mutual information is the only one that is based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...


A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information

For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations on the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution free. A quadratic time algorithm is described for computing the bound and its corresponding...


Mutual Information Based Gesture Recognition

The proliferation of gestural interfaces necessitates the creation of robust gesture recognition systems. A novel technique using mutual information to classify gestures in a recognition system is presented. As this technique is based on well-known information-theory metrics, the underlying operation is not as complex as many other techniques, which allows this technique to be easily implemented....


Mutual information-based context quantization

Context-based lossless coding suffers in many cases from the so-called context dilution problem, which arises when, in order to model high-order statistic dependencies among data, a large number of contexts is used. In this case the learning process cannot be fed with enough data, and so the probability estimation is not reliable. To avoid this problem, state-of-the-art algorithms for lossless ...


Reducing interpolation artifacts for mutual information based image registration

Medical image registration methods which use mutual information as a similarity measure have been improved in recent decades. Mutual information is a basic concept of information theory which indicates the dependency of two random variables (or two images). In order to evaluate the mutual information of two images, their joint probability distribution is required. Several interpolation methods, su...



Journal

Journal title: IEEE Access

Year: 2021

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2021.3107031